A Newton-like Algorithm for Likelihood Maximization: the Robust-variance Scoring Algorithm

Authors

  • Daniel Commenges
  • Hélène Jacqmin-Gadda
  • Cécile Proust
Abstract

This article considers a Newton-like method already used by several authors but not yet thoroughly studied. We call it the robust-variance scoring (RVS) algorithm because the main version of the algorithm that we consider replaces minus the Hessian of the log-likelihood used in the Newton-Raphson algorithm by a matrix G, an estimate of the variance of the score under the true probability, which uses only the individual scores. Thus an iteration of this algorithm requires much less computation than an iteration of the Newton-Raphson algorithm. Moreover, this estimate of the variance of the score estimates the information matrix at the maximum. We have also studied a convergence criterion which has the nice interpretation of estimating the ratio of the approximation error to the statistical error; thus it can be used for stopping the iterative process whatever the problem. A simulation study confirms that the RVS algorithm is faster than the Marquardt algorithm (a robust version of the Newton-Raphson algorithm); this happens because the number of iterations needed by the RVS algorithm is barely larger than that needed by the Marquardt algorithm, while the computation time for each iteration is much shorter. The coverage rates obtained using the matrix G are also satisfactory.
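The update the abstract describes, replacing minus the Hessian by a matrix G built from the outer products of the individual scores, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names and the quadratic-form stopping rule are assumptions.

```python
import numpy as np

def rvs_maximize(theta0, score_fn, tol=1e-8, max_iter=100):
    """Newton-like iteration using a robust estimate of the score variance.

    score_fn(theta) returns the (n, p) matrix of individual scores U_i.
    Minus the Hessian is replaced by G = sum_i U_i U_i^T, which needs
    only the individual scores.  The stopping rule below (a quadratic
    form in the score) is a common choice and an illustrative
    assumption, not necessarily the paper's exact criterion.
    """
    theta = np.atleast_1d(np.asarray(theta0, dtype=float))
    for _ in range(max_iter):
        U = np.asarray(score_fn(theta))   # individual scores, shape (n, p)
        score = U.sum(axis=0)             # total score (gradient of loglik)
        G = U.T @ U                       # outer-product estimate of Var(score)
        step = np.linalg.solve(G, score)  # RVS direction
        theta = theta + step
        if score @ step < tol:            # small normalized score => stop
            break
    return theta
```

For example, for the mean of a normal sample with known unit variance the individual score is simply x_i − θ, and the iteration converges to the sample mean.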


Similar articles

Robust Method for E-Maximization and Hierarchical Clustering of Image Classification

We developed a new semi-supervised EM-like algorithm that is given the set of objects present in each training image, but does not know which regions correspond to which objects. We have tested the algorithm on a dataset of 860 hand-labeled color images using only color and texture features, and the results show that our EM variant is able to break the symmetry in the initial solution. We compared...


SIZING OPTIMIZATION OF TRUSS STRUCTURES WITH NEWTON META-HEURISTIC ALGORITHM

This study is devoted to discrete sizing optimization of truss structures, employing an efficient discrete evolutionary meta-heuristic algorithm which uses the Newton gradient-based method as its updating scheme and is named here the Newton Meta-heuristic Algorithm (NMA). In order to enable the NMA population-based meta-heuristic to effectively explore the discrete design space, a term contain...


Numerical Optimization within Vector of Parameters Estimation in Volatility Models

In this paper, the usefulness of a quasi-Newton iteration procedure for estimating the parameters of the conditional variance equation within the BHHH algorithm is presented. Analytical maximization of the likelihood function using first and second derivatives is too complex when the variance is time-varying. The advantage of the BHHH algorithm in comparison to the other optimization algorithms is that...


Computational aspects of probit model

Sometimes the maximum likelihood estimation procedure for the probit model fails. There may be two reasons: the maximum likelihood estimate (MLE) simply does not exist, or a computer overflow error occurs during the computation of the cumulative distribution function (cdf). For example, the explosive approximation effect due to an inaccurate computation of the cdf for a large value of the argument o...
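The overflow described above is classically avoided by evaluating the normal cdf on the log scale. A minimal sketch of that idea follows; the switch-over threshold of −20 and the one-term asymptotic expansion are illustrative choices, not the paper's remedy.

```python
import math

def log_norm_cdf(x):
    """log Phi(x) for the standard normal, stable for very negative x.

    Direct evaluation 0.5 * erfc(-x / sqrt(2)) underflows to 0 for
    large negative x, so taking its log fails.  Below roughly -20 we
    switch to the leading term of the asymptotic expansion
    Phi(x) ~ phi(x) / (-x), i.e.
    log Phi(x) ~ -x^2/2 - log(-x) - log(2*pi)/2.
    """
    if x > -20.0:
        return math.log(0.5 * math.erfc(-x / math.sqrt(2.0)))
    return -0.5 * x * x - math.log(-x) - 0.5 * math.log(2.0 * math.pi)
```

At x = −40, for instance, the direct cdf underflows to exactly 0, while the log-scale value is an ordinary double near −805.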


On using generalized Cramér-Rao inequality to REML estimation in linear models

The main aim in the problem of estimating the variance components σ₁² and σ₂² by the ML method and the REML method in the normal mixed linear model N{Y, E(Y) = Xβ, Cov(Y) = σ₁²V + σ₂²Iₙ} was the examination of their efficiency. This is particularly important when an explicit form of these estimators is unknown and we search for solutions of the likelihood equa...





Publication date: 2008